A Class of Globally Convergent Optimization Methods Based on Conservative Convex Separable Approximations

Author

  • Krister Svanberg
Abstract

This paper deals with a certain class of optimization methods, based on conservative convex separable approximations (CCSA), for solving inequality-constrained nonlinear programming problems. Each generated iteration point is a feasible solution with lower objective value than the previous one, and it is proved that the sequence of iteration points converges toward the set of Karush–Kuhn–Tucker points. A major advantage of CCSA methods is that they can be applied to problems with a very large number of variables (say 10^4–10^5) even if the Hessian matrices of the objective and constraint functions are dense.
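As a rough illustration of the idea described in the abstract (a minimal sketch under stated assumptions, not the algorithm or update rules from the paper), the following Python fragment performs one CCSA-style outer iteration using the simplest conceivable member of the class: a linearization plus a separable quadratic term whose curvature parameter rho_i is doubled until the approximation is conservative at the trial point. The convex subproblem is handed to a generic NLP solver (SciPy's SLSQP) purely for brevity; real CCSA implementations solve it via a specialized dual method. The names `ccsa_step`, `funcs`, `grads`, `rho`, and `bounds` are illustrative, not from the paper.

```python
# Hypothetical sketch of one CCSA-style outer iteration (simplified quadratic variant,
# not Svanberg's exact scheme): raise the curvature parameters rho_i until every
# separable convex approximation is conservative at the subproblem solution.
import numpy as np
from scipy.optimize import minimize

def ccsa_step(funcs, grads, x, rho, bounds):
    """One outer iteration for: minimize f_0(x) s.t. f_i(x) <= 0, i = 1..m.

    funcs  : callables [f_0, f_1, ..., f_m]
    grads  : gradient callables, each returning a NumPy array
    x      : current feasible iterate (NumPy array)
    rho    : positive curvature parameters, one per function (mutable sequence)
    bounds : variable bounds passed to the subproblem solver
    """
    f_x = np.array([f(x) for f in funcs])
    g_x = [g(x) for g in grads]

    def approx(i, y):
        # Separable convex approximation: linearization plus a quadratic term.
        d = y - x
        return f_x[i] + g_x[i] @ d + 0.5 * rho[i] * (d @ d)

    while True:
        # Convex subproblem: minimize approx of f_0 s.t. approx of f_i <= 0.
        cons = [{"type": "ineq", "fun": (lambda y, i=i: -approx(i, y))}
                for i in range(1, len(funcs))]
        y = minimize(lambda y: approx(0, y), x, bounds=bounds,
                     constraints=cons, method="SLSQP").x

        # Conservativeness check: each approximation must overestimate the true
        # function at the trial point; otherwise raise rho_i and re-solve.
        conservative = True
        for i, f in enumerate(funcs):
            if approx(i, y) < f(y) - 1e-12:
                rho[i] *= 2.0
                conservative = False
        if conservative:
            # By conservativeness: f_i(y) <= approx_i(y) <= 0 and f_0(y) <= f_0(x).
            return y, rho
```

When all approximations are conservative at the trial point y, the properties stated in the abstract follow directly: f_i(y) <= approx_i(y) <= 0 for each constraint, so y is feasible, and f_0(y) <= approx_0(y) <= approx_0(x) = f_0(x), so the objective does not increase.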

Similar articles

A spectral updating for the method of moving asymptotes

The Method of Moving Asymptotes (MMA) is very popular within the structural optimization community and applies to inequality-constrained nonlinear programming problems with simple bounds. In its more recent version (Svanberg, 2002), MMA was merged into the Conservative, Convex and Separable Approximation (CCSA) class of algorithms, which are globally convergent. In this work a modi...
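For context, the classical MMA approximation replaces each problem function f_i near the current iterate x^(k) by a convex separable function built from lower and upper asymptotes L_j^(k) < x_j^(k) < U_j^(k); schematically (the coefficient and asymptote update rules differ between MMA versions),

$$\tilde f_i^{(k)}(x) \;=\; r_i^{(k)} + \sum_{j=1}^{n}\left(\frac{p_{ij}^{(k)}}{U_j^{(k)} - x_j} + \frac{q_{ij}^{(k)}}{x_j - L_j^{(k)}}\right),$$

with the nonnegative coefficients p_ij^(k), q_ij^(k) chosen from the sign of the partial derivative of f_i at x^(k) so that the approximation matches the function value and gradient there.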

A parameterized proximal point algorithm for separable convex optimization

In this paper, we develop a parameterized proximal point algorithm (PPPA) for solving a class of separable convex programming problems subject to linear and convex constraints. The proposed algorithm is provably globally convergent with a worst-case O(1/t) convergence rate, where t denotes the iteration number. By properly choosing the algorithm parameters, numerical experiments on solvin...

On the conditional acceptance of iterates in SAO algorithms based on convex separable approximations

We reflect on the convergence and termination of optimization algorithms based on convex and separable approximations using two recently proposed strategies, namely a trust region with filtered acceptance of the iterates, and conservatism. We then propose a new strategy for convergence and termination, denoted filtered conservatism, in which the acceptance or rejection of an iterate is determi...

An interior-point Lagrangian decomposition method for separable convex optimization

In this paper we propose a distributed algorithm for solving large-scale separable convex problems using Lagrangian dual decomposition and the interior-point framework. By adding self-concordant barrier terms to the ordinary Lagrangian we prove under mild assumptions that the corresponding family of augmented dual functions is self-concordant. This makes it possible to efficiently use the Newto...

Generalized Kernel Classification and Regression

We discuss kernel-based classification and regression in a general context, emphasizing the role of convex duality in the problem formulation. We give conditions for the existence of the dual problem, and derive general globally convergent classification and regression algorithms for solving the true (i.e. hard-margin or rigorous) dual problem without resorting to approximations.


Journal:
  • SIAM Journal on Optimization

Volume: 12    Issue: –

Pages: –

Publication year: 2002